- Title
- Accelerating federated edge learning
- Creator
- Nguyen, Tuan Dung; Balef, Amir R.; Dinh, Canh T.; Tran, Nguyen H.; Ngo, Duy T.; Le, Tuan Anh; Vo, Phuong L.
- Relation
- IEEE Communications Letters, Vol. 25, Issue 10, pp. 3282-3286
- Publisher Link
- http://dx.doi.org/10.1109/LCOMM.2021.3103536
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Resource Type
- journal article
- Date
- 2021
- Description
- Transferring large models in federated learning (FL) networks is often hindered by clients’ limited bandwidth. We propose FedAA, an FL algorithm that achieves fast convergence by applying regularized Anderson acceleration (AA) at the global (server) level. First, we demonstrate that FL can benefit from acceleration methods from numerical analysis. Second, compared with FedAvg, an FL algorithm using gradient descent (GD) local updates, FedAA improves the convergence rate for quadratic losses and the empirical performance for smooth, strongly convex objectives. Experimental results demonstrate that employing AA can significantly improve the performance of FedAvg, even when the objective is non-convex.
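- The abstract describes applying regularized Anderson acceleration to the server-side sequence of global models. The sketch below is not the authors' implementation; it is a minimal illustration, under the assumption that one FedAvg round acts as the fixed-point map g, of a regularized Type-II AA extrapolation step. The names `regularized_anderson_step`, `fedavg_round`, the memory size implied by the history length, the regularization weight `lam`, and the damping `beta` are all hypothetical choices for illustration.

```python
# Minimal sketch (NOT the paper's code): regularized Anderson acceleration
# applied at the server to the sequence of global models in an FL loop.
import numpy as np

def regularized_anderson_step(x_hist, g_hist, lam=1e-8, beta=1.0):
    """One regularized Type-II AA extrapolation from histories of
    iterates x_i and fixed-point map values g(x_i)."""
    X = np.stack(x_hist, axis=1)            # d x (m+1) past global models
    G = np.stack(g_hist, axis=1)            # d x (m+1) map values g(x_i)
    F = G - X                               # residuals f_i = g(x_i) - x_i
    dX, dF = np.diff(X, axis=1), np.diff(F, axis=1)
    fk = F[:, -1]
    # Tikhonov-regularized least squares:
    #   gamma = argmin ||fk - dF @ gamma||^2 + lam * ||gamma||^2
    gamma = np.linalg.solve(dF.T @ dF + lam * np.eye(dF.shape[1]), dF.T @ fk)
    # Damped AA update: x_{k+1} = x_k + beta*f_k - (dX + beta*dF) @ gamma
    return X[:, -1] + beta * fk - (dX + beta * dF) @ gamma

def fedavg_round(x, client_grads, lr=0.1):
    """Stand-in fixed-point map g: average one GD step over all clients."""
    return x - lr * np.mean([grad(x) for grad in client_grads], axis=0)

# Usage sketch: keep the last m+1 (model, g(model)) pairs and extrapolate.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = [np.diag(rng.uniform(0.5, 2.0, 5)) for _ in range(4)]  # toy quadratics
    client_grads = [(lambda x, A=A: A @ x) for A in A]
    x = rng.normal(size=5)
    x_hist, g_hist = [x], [fedavg_round(x, client_grads)]
    for _ in range(20):
        x = regularized_anderson_step(x_hist[-4:], g_hist[-4:])
        x_hist.append(x)
        g_hist.append(fedavg_round(x, client_grads))
    print("final residual norm:", np.linalg.norm(g_hist[-1] - x_hist[-1]))
```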
- Subject
- Anderson acceleration; distributed optimization; federated learning; convergence
- Identifier
- http://hdl.handle.net/1959.13/1436848
- Identifier
- uon:40157
- Identifier
- ISSN:1089-7798
- Language
- eng
- Reviewed